# Large Context Window
## Gemma 3 4B It Qat GGUF

Image-to-Text · lmstudio-community · 46.55k downloads · 10 likes

The Gemma 3 4B IT model by Google supports multimodal input and long-context processing, making it suitable for text generation and image understanding tasks.
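This GGUF build (like the EEVE entry below) targets local runtimes such as LM Studio and llama.cpp. A minimal sketch with llama-cpp-python; the .gguf filename, quantization level, and context size are placeholders rather than values taken from this listing:

```python
# Minimal local-inference sketch with llama-cpp-python (pip install llama-cpp-python).
# The model filename below is hypothetical; download the actual .gguf artifact
# from the lmstudio-community repository and point model_path at it.
from llama_cpp import Llama

llm = Llama(
    model_path="gemma-3-4b-it-qat-q4_0.gguf",  # placeholder filename
    n_ctx=8192,  # raise toward the model's long-context limit as memory allows
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give a one-paragraph summary of QAT quantization."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```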
## EEVE Korean Instruct 10.8B V1.0 Gguf

Large Language Model · Apache-2.0 · teddylee777 · 626 downloads · 21 likes

EEVE-Korean-Instruct-10.8B-v1.0 is a Korean instruction-tuned large language model built on the yanolja/EEVE-Korean-10.8B-v1.0 foundation model, specializing in Korean language understanding and generation tasks.
## Nekomata 14b

Large Language Model · Transformers · Supports Multiple Languages · Other · rinna · 705 downloads · 20 likes

A large language model based on Qwen-14B and continually pre-trained on a mixed Japanese and English dataset, significantly improving performance on Japanese tasks.
## Colossal LLaMA 2 7b Base

Large Language Model · Transformers · Supports Multiple Languages · hpcai-tech · 147 downloads · 76 likes

An open-source bilingual Chinese-English large language model based on LLaMA-2, continually pre-trained on approximately 8.5 billion tokens and supporting a context window of 4,096 tokens.
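The two continually pre-trained bilingual models above load through the standard Hugging Face transformers causal-LM interface. A sketch assuming the repository id rinna/nekomata-14b, which is inferred from the listing and should be verified on the hub:

```python
# Plain-text continuation with a continually pre-trained causal LM.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "rinna/nekomata-14b"  # assumed repository id; verify before use
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,  # Qwen-derived checkpoints often require this
)

prompt = "西田幾多郎は、"  # a Japanese continuation prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```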
## Long T5 Tglobal Large Pubmed 3k Booksum 16384 WIP15

Text Generation · Transformers · BSD-3-Clause · pszemraj · 17 downloads · 0 likes

A large-scale summarization model based on the Long-T5 architecture, specifically optimized for book and long-document summarization tasks.
## Long T5 Tglobal Base 16384 Booksum V12

Text Generation · Transformers · BSD-3-Clause · pszemraj · 109 downloads · 4 likes

An optimized long-text summarization model based on the T5 architecture, capable of processing inputs of up to 16,384 tokens and excelling at book summarization tasks.
## Long T5 Tglobal Base 16384 Book Summary

Text Generation · BSD-3-Clause · pszemraj · 24.19k downloads · 134 likes

A book summarization model based on the Long-T5 architecture, capable of processing long documents and generating high-quality summaries.
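The Long-T5 checkpoints in this group (including the WIP variant below) share the same summarization interface. A sketch using the transformers summarization pipeline; the repository id is inferred from the listing name and may differ:

```python
# Long-document summarization sketch with a Long-T5 checkpoint.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="pszemraj/long-t5-tglobal-base-16384-book-summary",  # inferred id
)

with open("chapter.txt") as f:  # any document up to roughly 16,384 tokens
    long_text = f.read()

result = summarizer(
    long_text,
    max_length=256,          # cap on summary length, in tokens
    min_length=64,
    no_repeat_ngram_size=3,  # discourage repetitive phrasing
    truncation=True,         # clip inputs beyond the 16,384-token window
)
print(result[0]["summary_text"])
```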
## Long T5 Tglobal Large Pubmed 3k Booksum 16384 WIP

Text Generation · Transformers · Apache-2.0 · pszemraj · 65 downloads · 1 like

A large-scale summarization model based on the Long-T5 architecture, specifically optimized for long-document summarization tasks and supporting a context length of 16,384 tokens.